
[Spyre-Next] Custom fixtures for upstream tests#884

Open
joerunde wants to merge 4 commits into torch-spyre:main from joerunde:custom-fixtures

Conversation

@joerunde
Collaborator

@joerunde joerunde commented Apr 1, 2026

Description

This PR adds the ability to specify custom fixtures to be injected for sets of upstream tests. This allows us to run custom setup and teardown where we may need to patch things like hardcoded inputs that assume cuda.

Eventually we will update the upstream tests to be more portable as part of a larger effort, but having custom fixtures allows us to run some upstream tests now, so we don't have to wait until we've finished the work to fix the upstream tests. This is important for #853, which currently vendors upstream test files and edits them. This PR includes an example of a fixture that makes that set of tests usable as-is.

This PR depends on #853; once that lands, I will update the diff here to remove the duplicated test files.

Related Issues

Removes duplicated test files that will land in #853

Test Plan

bot:next-test to ensure the upstream tests work 😄

Checklist

  • I have read the contributing guidelines
  • My code follows the project's code style (run bash format.sh)
  • I have added tests for my changes (if applicable)
  • I have updated the documentation (if applicable)
  • My commits include a Signed-off-by: line (DCO compliance)

joerunde added 2 commits April 1, 2026 08:27
Signed-off-by: Joe Runde <joe@joerun.de>
Signed-off-by: Joe Runde <joe@joerun.de>
@github-actions

github-actions Bot commented Apr 1, 2026

👋 Hi! Thank you for contributing to vLLM support on Spyre.
Just a reminder: Make sure that your code passes all the linting checks, otherwise your PR won't be able to be merged. To do so, run ./format.sh.
Now you are good to go 🚀.

We also recommend installing prek and configuring it to check your code before every local commit.

@github-actions github-actions Bot changed the title Custom fixtures for upstream tests [Spyre-Next] Custom fixtures for upstream tests Apr 1, 2026
return tests_dir


def _spicy_code_edits(upstream_tests_dir: Path):
Collaborator
NIT: "spicy" adds some good flavor, but maybe "_temp_upstream_workarounds" or "_fixme_code_edits" or something like that would be clearer.

"""

# Mocking out torch.device seems impossible to do (at least multiple rounds of Bob and Claude
# were unsuccessful). So we patch the source code to change the hardcoded
Collaborator
My AI questioning tells me that torch.device is a native C++ type defined by torch and that is why it is not possible to monkeypatch (some code using torch.device may even resolve fully at import time):
https://github.com/pytorch/pytorch/blob/main/torch/csrc/Device.cpp#L242

Collaborator Author

Yup, that's what I was running into. For some cases it's possible to patch it with a simple function, but there are a lot of usages like isinstance(thing, torch.device) that expect torch.device to still be a class, which I couldn't find a way to work around :/
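The isinstance failure mode discussed here can be reproduced without torch at all: once a module attribute that used to be a class is replaced by a plain function, construction still works, but isinstance checks against the patched name raise. Device and fake_module below are stand-ins for illustration, not torch code.

```python
import types


class Device:
    """Stand-in for torch.device (illustration only)."""

    def __init__(self, spec: str):
        self.spec = spec


fake_module = types.SimpleNamespace(device=Device)

# Patch the constructor with a function that rewrites cuda -> cpu.
_original = fake_module.device
fake_module.device = lambda spec: _original(spec.replace("cuda", "cpu"))

d = fake_module.device("cuda:0")
print(d.spec)  # "cpu:0" -- construction still works

# But isinstance() now fails, because the patched name is a function, not a type:
try:
    isinstance(d, fake_module.device)
except TypeError as e:
    print("isinstance broke:", e)
```

torch.device is additionally a native type defined in C++, so even rebinding its attributes is off the table, which is why patching the source text was the remaining option.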

Collaborator
@bohnstingl bohnstingl left a comment
I like the idea of this PR a lot!
I haven't yet tested it myself, but @joerunde is this supposed to be working right now and can I try it, or are there limitations I should know about?

Comment on lines +641 to +659
"small_decode": test_module.BatchSpec(seq_lens=[40], query_lens=[1]),
"small_prefill": test_module.BatchSpec(seq_lens=[40], query_lens=[8]),
"mixed_small": test_module.BatchSpec(seq_lens=[48], query_lens=[5]),
"medium_decode": test_module.BatchSpec(
seq_lens=[1024],
query_lens=[1],
),
"medium_prefill": test_module.BatchSpec(seq_lens=[1024], query_lens=[16]),
"mixed_medium": test_module.BatchSpec(seq_lens=[2048], query_lens=[1]),
"large_decode": test_module.BatchSpec(seq_lens=[2048], query_lens=[1]),
"large_prefill": test_module.BatchSpec(seq_lens=[4096], query_lens=[32]),
"mixed_large": test_module.BatchSpec(seq_lens=[4096], query_lens=[32]),
"single_decode": test_module.BatchSpec(seq_lens=[1024], query_lens=[1]),
"single_prefill": test_module.BatchSpec(seq_lens=[1024], query_lens=[64]),
# encoder-only
"small_encoder_prefill": test_module.BatchSpec(seq_lens=[32], query_lens=[32]),
"medium_encoder_prefill": test_module.BatchSpec(seq_lens=[256], query_lens=[256]),
}
monkeypatch.setattr(test_module, "BATCH_SPECS", our_batch_specs)
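The monkeypatch.setattr call above swaps out a module-level table and restores it automatically on teardown. The same setup/teardown shape can be sketched with the stdlib alone; test_module and the dict values here are stand-ins, not the PR's actual specs.

```python
import types
from unittest import mock

# Stand-in for an imported upstream test module (hypothetical).
test_module = types.SimpleNamespace(BATCH_SPECS={"large_prefill": "orig"})

# patch.object as a context manager mirrors what a pytest monkeypatch
# fixture does: replace the attribute for the test, restore it afterwards.
with mock.patch.object(test_module, "BATCH_SPECS", {"small_prefill": "ours"}):
    print(test_module.BATCH_SPECS)  # patched table is visible to the tests

print(test_module.BATCH_SPECS)  # original table is restored on exit
```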
Collaborator
I think these patches may not be necessary anymore, as the attention implementation from #853 does support batch_size > 1, right?

Collaborator Author

Nice! It was failing when I wrote this, but I see it's been updated to support it now.

@joerunde
Collaborator Author

joerunde commented Apr 6, 2026

I like the idea of this PR a lot! I haven't yet tested it myself, but @joerunde is this supposed to be working right now and can I try it, or are there limitations I should know about?

@bohnstingl the changes here should be working as-is, but you'd need to rebase this on top of #853 for the new custom attention backend to actually exist and function, since that hasn't merged yet

joerunde added 2 commits April 6, 2026 15:18
Signed-off-by: Joe Runde <joe@joerun.de>
Signed-off-by: Joe Runde <joe@joerun.de>
